Improving Naive Bayes Classifier Using Conditional Probabilities
Authors
Abstract
The Naive Bayes classifier is the simplest among Bayesian network classifiers. It has been shown to be very efficient on a variety of data classification problems. However, its strong assumption that all features are conditionally independent given the class is often violated in many real-world applications. Improving the Naive Bayes classifier by relaxing the feature independence assumption has therefore attracted much attention. In this paper, we develop a new version of the Naive Bayes classifier that does not assume independence of features. The proposed algorithm approximates the interactions between features by using conditional probabilities. We present results of numerical experiments on several real-world data sets, where continuous features are discretized by applying two different methods. These results demonstrate that the proposed algorithm significantly improves the performance of the Naive Bayes classifier while maintaining its robustness.
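The baseline the abstract starts from can be sketched as a few lines of code: a standard Naive Bayes classifier scores each class as the prior times a product of per-feature conditional probabilities, which is exactly the independence assumption the paper seeks to relax. The toy weather data and function names below are illustrative, not taken from the paper.

```python
from collections import defaultdict

def train_nb(X, y):
    """Estimate class priors P(c) and per-feature conditionals P(x_i | c)
    from categorical training data (smoothing omitted for brevity)."""
    priors = defaultdict(float)   # class -> count, then probability
    counts = defaultdict(int)     # (class, feature index, value) -> count
    for xs, c in zip(X, y):
        priors[c] += 1
        for i, v in enumerate(xs):
            counts[(c, i, v)] += 1
    # Normalise conditionals while priors still hold raw class counts.
    cond = {(c, i, v): k / priors[c] for (c, i, v), k in counts.items()}
    for c in priors:
        priors[c] /= len(y)
    return priors, cond

def predict_nb(priors, cond, xs):
    """Pick the class maximising P(c) * prod_i P(x_i | c) -- the
    conditional-independence factorisation of standard Naive Bayes."""
    best, best_score = None, -1.0
    for c, p in priors.items():
        score = p
        for i, v in enumerate(xs):
            score *= cond.get((c, i, v), 0.0)
        if score > best_score:
            best, best_score = c, score
    return best

# Toy categorical data: (outlook, windy) -> decision
X = [("sunny", "no"), ("sunny", "yes"), ("rain", "no"), ("rain", "yes")]
y = ["play", "play", "play", "stop"]
priors, cond = train_nb(X, y)
print(predict_nb(priors, cond, ("sunny", "no")))  # prints "play"
```

The product over features is where the independence assumption lives; the paper's contribution is to replace these factors with terms that account for feature interactions.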
Similar resources
Learning the naive Bayes classifier with optimization models
Naive Bayes is among the simplest probabilistic classifiers. It often performs surprisingly well in many real world applications, despite the strong assumption that all features are conditionally independent given the class. In the learning process of this classifier with the known structure, class probabilities and conditional probabilities are calculated using training data, and then values o...
Boosting the Tree Augmented Naïve Bayes Classifier
The Tree Augmented Naïve Bayes (TAN) classifier relaxes the sweeping independence assumptions of the Naïve Bayes approach by taking account of conditional probabilities. It does this in a limited sense, by incorporating the conditional probability of each attribute given the class and (at most) one other attribute. The method of boosting has previously proven very effective in improving the per...
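The TAN-style factorisation described in this snippet, where each feature is conditioned on the class and at most one parent feature, can be sketched as a scoring function. The parent structure, probability-table layout, and numeric values below are illustrative assumptions; this is the factorisation only, not the boosting procedure of the cited paper.

```python
def tan_score(prior, cond, xs, parent):
    """Class score under a TAN-style factorisation: feature i is
    conditioned on the class and one parent feature parent[i]
    (None for the tree root). `cond` maps
    (feature index, value, parent value) -> P(x_i | c, x_parent)."""
    score = prior
    for i, v in enumerate(xs):
        pv = xs[parent[i]] if parent[i] is not None else None
        score *= cond.get((i, v, pv), 0.0)
    return score

# Two features; feature 1's parent is feature 0 (feature 0 is the root).
parent = [None, 0]
cond = {
    (0, "sunny", None): 0.6,   # P(outlook=sunny | c)
    (1, "no", "sunny"): 0.5,   # P(windy=no | c, outlook=sunny)
}
print(tan_score(0.75, cond, ("sunny", "no"), parent))  # 0.75 * 0.6 * 0.5
```

Because each feature has at most one extra parent, the model stays nearly as cheap as Naive Bayes while capturing one pairwise interaction per feature.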
Perplexed Bayes Classifier
Naive Bayes classifiers estimate posterior probabilities poorly (Zhang, 2004). In this paper, we propose a modification to the Naive Bayes classification algorithm which improves the classifier’s posterior probability estimates without affecting its performance. Since the modification involves the use of the reciprocal of the perplexity of the class-conditional feature probabilities, we call th...
Naive Bayes and Text Classification I - Introduction and Theory
2 Naive Bayes Classification: 2.1 Overview; 2.2 Posterior Probabilities; 2.3 Class-conditional Probabilities; 2.4 Prior Probabilities; 2.5 Evidence ...
Journal title:
Volume, Issue
Pages: -
Publication date: 2011